New quasi-Newton methods for unconstrained optimization problems

Authors

  • Zengxin Wei
  • Guoyin Li
  • Liqun Qi
Abstract

Many methods for solving minimization problems are variants of Newton's method, which requires the specification of the Hessian matrix of second derivatives. Quasi-Newton methods are intended for situations where the Hessian is expensive or difficult to calculate. They use only first derivatives to build an approximate Hessian over a number of iterations, updating this approximation at each iteration by a matrix of low rank. In unconstrained minimization, the original quasi-Newton equation is B_{k+1} s_k = y_k, where y_k is the difference of the gradients at the last two iterates. In this paper, we first propose a new quasi-Newton equation B_{k+1} s_k = y_k^*, in which y_k^* is determined by the sum of y_k and A_k s_k, where A_k is some matrix. We then give two choices of A_k which carry some second-order information from the Hessian of the objective function. The three corresponding BFGS-type algorithms are proved to possess the global convergence property, and the superlinear convergence of one of the algorithms is proved. Extensive numerical experiments have been conducted which show that the proposed algorithms are very encouraging.

© 2005 Elsevier Inc. All rights reserved. doi:10.1016/j.amc.2005.08.027
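The modified quasi-Newton equation can be illustrated with a short sketch. The helper below applies the standard BFGS update to an arbitrary difference vector, and `modified_secant` builds one vector of the general form y_k^* = y_k + A_k s_k using function-value information. This particular rank-one choice of A_k is taken from the modified-secant literature as an assumption for illustration; the function names are ours, and the two specific A_k matrices proposed in the paper may differ.

```python
import numpy as np

def bfgs_update(B, s, y):
    """One BFGS update of the Hessian approximation B.

    Works for the classical secant vector y_k or any modified vector
    y_k^* = y_k + A_k s_k; the result satisfies B_new @ s == y exactly.
    """
    Bs = B @ s
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)

def modified_secant(s, y, f_old, f_new, g_old, g_new):
    """One possible y_k^* = y_k + (theta_k / ||s_k||^2) s_k, i.e. a
    rank-one choice of A_k built from function values (illustrative
    assumption; not necessarily the paper's choice).

    On a quadratic objective theta_k vanishes, so y_k^* reduces to y_k.
    """
    theta = 2.0 * (f_old - f_new) + (g_old + g_new) @ s
    return y + (theta / (s @ s)) * s
```

Both secant conditions coincide on quadratics; on general nonlinear functions, y_k^* injects extra curvature information from function values into the update.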


Similar articles

On the Behavior of Damped Quasi-Newton Methods for Unconstrained Optimization

We consider a family of damped quasi-Newton methods for solving unconstrained optimization problems. This family resembles that of Broyden with line searches, except that the change in gradients is replaced by a certain hybrid vector before updating the current Hessian approximation. This damped technique modifies the Hessian approximations so that they are maintained sufficiently positive defi...
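One classical instance of such damping is Powell's modification, which blends y_k with B_k s_k so that the curvature condition stays safely positive before the update; the hybrid vector used in the paper above may differ, so treat this as an illustrative sketch of the general damping idea.

```python
import numpy as np

def powell_damped_y(B, s, y, eta=0.2):
    """Powell's damped replacement for y: y_d = theta*y + (1-theta)*B@s.

    theta is chosen so that s @ y_d >= eta * (s @ B @ s) > 0, which keeps
    the subsequent BFGS update positive definite even when s @ y <= 0.
    """
    Bs = B @ s
    sBs = s @ Bs
    sy = s @ y
    if sy >= eta * sBs:
        return y  # curvature condition already holds; no damping needed
    theta = (1.0 - eta) * sBs / (sBs - sy)
    return theta * y + (1.0 - theta) * Bs
```

When damping activates, s @ y_d equals exactly eta * (s @ B @ s), so the Hessian approximations are maintained sufficiently positive definite regardless of the line search outcome.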


A Free Line Search Steepest Descent Method for Solving Unconstrained Optimization Problems

In this paper, we solve unconstrained optimization problems using a free line search steepest descent method. First, we propose a double-parameter scaled quasi-Newton formula for calculating an approximation of the Hessian matrix. The approximation obtained from this formula is a positive definite matrix that satisfies the standard secant relation. We also show that the largest eigenvalue...


A limited memory adaptive trust-region approach for large-scale unconstrained optimization

This study concerns a trust-region-based method for solving unconstrained optimization problems. The approach takes advantage of the compact limited-memory BFGS updating formula together with an appropriate adaptive radius strategy. In our approach, the adaptive technique leads us to decrease the number of subproblems solved, while utilizing the structure of limited memory quasi-Newt...


An efficient improvement of the Newton method for solving nonconvex optimization problems

Newton's method is one of the most famous numerical line search methods for minimizing functions. It is well known that the search direction and step length play important roles in this class of methods for solving optimization problems. In this investigation, a new modification of the Newton method for solving unconstrained optimization problems is presented. The significant ...
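A plain Newton iteration with a backtracking (Armijo) line search, the baseline that modifications like the one above start from, can be sketched as follows; the paper's specific modification is not reproduced here.

```python
import numpy as np

def newton_linesearch(f, grad, hess, x0, tol=1e-8, max_iter=100):
    """Newton's method with a backtracking line search (Armijo condition)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = np.linalg.solve(hess(x), -g)  # Newton direction
        if g @ d >= 0:                    # safeguard: fall back to steepest descent
            d = -g
        t = 1.0
        while f(x + t * d) > f(x) + 1e-4 * t * (g @ d):  # Armijo backtracking
            t *= 0.5
        x = x + t * d
    return x
```

On a strictly convex quadratic, the full Newton step (t = 1) passes the Armijo test and the method reaches the minimizer in a single iteration; the step length t matters on general nonconvex functions.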


Modifying the line search formula in the BFGS method to achieve global convergence




Journal:
  • Applied Mathematics and Computation

Volume 175

Pages 1156–1188

Publication year 2006